Improving question answering performance using knowledge distillation and active learning
Authors
Abstract
Contemporary question answering (QA) systems, including Transformer-based architectures, suffer from increasing computational and model complexity, which renders them inefficient for real-world applications with limited resources. Furthermore, training or even fine-tuning such models requires a vast amount of labeled data, which is often not available for the task at hand. In this manuscript, we conduct a comprehensive analysis of the mentioned challenges and introduce suitable countermeasures. We propose a novel knowledge distillation (KD) approach to reduce the parameter count of a pre-trained bidirectional encoder representations from Transformers (BERT) system, and we utilize multiple active learning (AL) strategies for an immense reduction in annotation efforts. We show the efficacy of our approach by comparing it with four state-of-the-art (SOTA) Transformer-based systems, namely KroneckerBERT, EfficientBERT, TinyBERT, and DistilBERT. Specifically, we outperform KroneckerBERT21 and EfficientBERTTINY by 4.5 and 0.4 percentage points in EM, despite having 75.0% and 86.2% fewer parameters, respectively. Additionally, our model achieves performance comparable to 6-layer TinyBERT and DistilBERT while using only 2% of their total trainable parameters. Besides, by integrating AL approaches into the BERT framework, we show that SOTA results on QA datasets can be achieved when we use only 40% of the training data. Overall, all our results demonstrate the effectiveness of our approaches in achieving high performance while extremely reducing the number of parameters and the labeling effort. Finally, we make our code publicly available at https://github.com/mirbostani/QA-KD-AL.
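The paper's exact KD and AL methods are not reproduced here, but the two core ideas the abstract names can be sketched in a few lines. The snippet below is a minimal illustration, assuming Hinton-style soft-target distillation (KL divergence between temperature-softened teacher and student distributions) and least-confidence sampling as the AL criterion; the function names and the temperature value are illustrative, not taken from the paper.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of raw logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    The T^2 factor keeps gradient magnitudes comparable across
    temperatures, as in the standard soft-target formulation.
    """
    p = softmax(teacher_logits, temperature)  # soft teacher targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

def least_confidence(probs):
    """Active-learning uncertainty score: 1 - max class probability.

    Unlabeled examples with the highest score are the ones the model
    is least sure about, so they are sent to annotators first.
    """
    return 1.0 - max(probs)
```

In a distillation loop, `distillation_loss` would be combined with the usual cross-entropy on gold labels, while `least_confidence` would rank the unlabeled pool before each annotation round; the paper evaluates several such AL strategies, of which least-confidence is only one common example.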
Similar Articles
Improving Question Retrieval in Community Question Answering Using World Knowledge
Community question answering (cQA), which provides a platform for people with diverse backgrounds to share information and knowledge, has become an increasingly popular research topic. In this paper, we focus on the task of question retrieval. The key problem of question retrieval is to measure the similarity between the queried questions and the historical questions which have been solved by ot...
Automatic Medical Knowledge Acquisition Using Question-Answering
We aim at proposing a rule generation approach to automatically acquire structured rules that can be used in decision support systems for drug prescription. We apply a question-answering engine to answer specific information requests. The rule generation is seen as an equation problem, where the factors are known items of the rule (e.g., an infectious disease, caused by a given bacteria) and so...
Using Scenario Knowledge In Automatic Question Answering
This paper describes a novel framework for using scenario knowledge in open-domain Question Answering (Q/A) applications that uses a state-of-the-art textual entailment system (Hickl et al., 2006b) in order to discover textual information relevant to the set of topics associated with a scenario description. An intrinsic and an extrinsic evaluation of this method is presented in the context of an...
Knowledge-Based Question Answering
This paper describes the Webclopedia Question Answering system, in which methods to automatically learn patterns and parameterizations are combined with hand-crafted rules and concept ontologies. The source for answers is a collection of 1 million newspaper texts, distributed by NIST. In general, two kinds of knowledge are used by Webclopedia to answer questions: knowledge about language and kn...
Knowledge Based Question Answering
The natural language database query system incorporated in the KNOBS interactive planning system comprises a dictionary driven parser, APE-II, and script interpreter which yield a conceptual dependency conceptualization as a representation of the meaning of user inpu...
Journal
Journal title: Engineering Applications of Artificial Intelligence
Year: 2023
ISSN: 1873-6769, 0952-1976
DOI: https://doi.org/10.1016/j.engappai.2023.106137